googleapis
The googleapis npm package is a client library for accessing Google APIs. It lets developers integrate a wide range of Google services into their applications, including Google Drive, Gmail, Google Calendar, YouTube, and many more.
Google Drive API
This code sample demonstrates how to list files in a user's Google Drive using the Google Drive API.
const { google } = require('googleapis');
const drive = google.drive({ version: 'v3', auth });

// List files in Google Drive
drive.files.list({}, (err, res) => {
  if (err) throw err;
  const files = res.data.files;
  if (files.length) {
    console.log('Files:');
    files.forEach((file) => {
      console.log(`${file.name} (${file.id})`);
    });
  } else {
    console.log('No files found.');
  }
});
Gmail API
This code sample shows how to send an email using the Gmail API.
const { google } = require('googleapis');
const gmail = google.gmail({ version: 'v1', auth });

// Send an email using the Gmail API
gmail.users.messages.send({
  userId: 'me',
  requestBody: {
    raw: emailMessage
  }
}, (err, res) => {
  if (err) throw err;
  console.log('Email sent:', res.data);
});
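The emailMessage variable in this sample must already contain an RFC 2822 message encoded with the URL-safe base64 alphabet, which is what the Gmail API expects in the raw field. A minimal sketch of such a helper (makeRawMessage is our own name, not part of the library):

```javascript
// Build an RFC 2822 message and encode it with the URL-safe
// base64 alphabet, the format the Gmail API expects in `raw`.
function makeRawMessage(from, to, subject, body) {
  const message = [
    `From: ${from}`,
    `To: ${to}`,
    `Subject: ${subject}`,
    'Content-Type: text/plain; charset=utf-8',
    '',
    body
  ].join('\r\n');
  // Standard base64, then swap in the URL-safe alphabet and drop padding.
  return Buffer.from(message)
    .toString('base64')
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
    .replace(/=+$/, '');
}

const emailMessage = makeRawMessage(
  'me@example.com', 'you@example.com', 'Hello', 'Hi there!'
);
console.log(emailMessage);
```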
Google Calendar API
This code sample illustrates how to create a new event in a user's Google Calendar.
const { google } = require('googleapis');
const calendar = google.calendar({ version: 'v3', auth });

// Insert a new calendar event
calendar.events.insert({
  calendarId: 'primary',
  resource: event,
}, (err, res) => {
  if (err) throw err;
  console.log('Event created: %s', res.data.htmlLink);
});
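The event variable here is assumed to already hold an events-resource object. A minimal sketch, using common Calendar v3 event fields (summary, start, and end with RFC 3339 timestamps):

```javascript
// A minimal Calendar v3 event resource: a summary plus a timed start/end.
const event = {
  summary: 'Team sync',
  description: 'Weekly project check-in',
  start: {
    dateTime: '2024-05-28T09:00:00-07:00',
    timeZone: 'America/Los_Angeles'
  },
  end: {
    dateTime: '2024-05-28T09:30:00-07:00',
    timeZone: 'America/Los_Angeles'
  }
};
```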
YouTube API
This code sample demonstrates how to search for videos on YouTube using the YouTube API.
const { google } = require('googleapis');
const youtube = google.youtube({ version: 'v3', auth });

// Search for videos on YouTube
youtube.search.list({
  part: 'snippet',
  q: 'Node.js on Google Cloud',
  maxResults: 10
}, (err, response) => {
  if (err) throw err;
  const videos = response.data.items;
  if (videos.length) {
    console.log('Search results:');
    videos.forEach((video) => {
      console.log(`${video.snippet.title}`);
    });
  } else {
    console.log('No search results found.');
  }
});
Node.js client library for using Google APIs. Support for authorization and authentication with OAuth 2.0, API Keys and JWT tokens is included.
The full list of supported APIs can be found on the Google APIs Explorer. The API endpoints are automatically generated, so if the API is not in the list, it is currently not supported by this API client library.
If you're working with Google Cloud Platform APIs such as Datastore, Cloud Storage or Pub/Sub, consider using the @google-cloud client libraries: single-purpose, idiomatic Node.js clients for Google Cloud Platform services.
These client libraries are officially supported by Google. However, these libraries are considered complete and are in maintenance mode. This means that we will address critical bugs and security issues but will not add any new features. For Google Cloud Platform APIs, we recommend using google-cloud-node which is under active development.
This library supports the maintenance LTS, active LTS, and current release of node.js. See the node.js release schedule for more information.
This library is distributed on npm. To add it as a dependency, run the following command:

$ npm install googleapis
This is a very simple example that creates a Blogger client and retrieves the details of a blog, given the blog ID:
const {google} = require('googleapis');

// Each API may support multiple versions. With this sample, we're getting
// v3 of the blogger API, and using an API key to authenticate.
const blogger = google.blogger({
  version: 'v3',
  auth: 'YOUR API KEY'
});

const params = {
  blogId: 3213900
};

// get the blog details
blogger.blogs.get(params, (err, res) => {
  if (err) {
    console.error(err);
    throw err;
  }
  console.log(`The blog url is ${res.data.url}`);
});
Instead of using callbacks you can also use promises!
blogger.blogs.get(params)
  .then(res => {
    console.log(`The blog url is ${res.data.url}`);
  })
  .catch(error => {
    console.error(error);
  });
Or async/await:
async function runSample() {
  const res = await blogger.blogs.get(params);
  console.log(`The blog url is ${res.data.url}`);
}
runSample().catch(console.error);
There are a lot of samples 🤗 If you're trying to figure out how to use an API, look there first! If a sample you need is missing, feel free to file an issue.
This library has a full set of API Reference Documentation. This documentation is auto-generated, and the location may change.
There are three primary ways to authenticate to Google APIs. Some services support all authentication methods; others may only support one or two.
OAuth2 - This allows you to make API calls on behalf of a given user. In this model, the user visits your application, signs in with their Google account, and provides your application with authorization against a set of scopes. Learn more.
Service <--> Service - In this model, your application talks directly to Google APIs using a Service Account. It's useful when you have a backend application that will talk directly to Google APIs from the backend. Learn more.
API Key - With an API key, you can access your service from a client or the server. Typically less secure, this is only available on a small subset of services with limited scopes. Learn more.
To learn more about the authentication client, see the Google Auth Library.
This client comes with an OAuth2 client that allows you to retrieve an access token, refresh it, and retry the request seamlessly. The basics of Google's OAuth2 implementation are explained in the Google Authorization and Authentication documentation.
In the following examples, you may need a CLIENT_ID, CLIENT_SECRET and REDIRECT_URL. You can find these pieces of information by going to the Developer Console, clicking your project --> APIs & auth --> credentials.
For more information about OAuth2 and how it works, see here.
A complete sample application that authorizes and authenticates with the OAuth2 client is available at samples/oauth2.js.
To ask for permissions from a user to retrieve an access token, you redirect them to a consent page. To create a consent page URL:
const {google} = require('googleapis');

const oauth2Client = new google.auth.OAuth2(
  YOUR_CLIENT_ID,
  YOUR_CLIENT_SECRET,
  YOUR_REDIRECT_URL
);

// generate a url that asks permissions for Blogger and Google Calendar scopes
const scopes = [
  'https://www.googleapis.com/auth/blogger',
  'https://www.googleapis.com/auth/calendar'
];

const url = oauth2Client.generateAuthUrl({
  // 'online' (default) or 'offline' (gets refresh_token)
  access_type: 'offline',

  // If you only need one scope you can pass it as a string
  scope: scopes
});
IMPORTANT NOTE - The refresh_token is only returned on the first authorization. More details here.
Once a user has given permissions on the consent page, Google will redirect the page to the redirect URL you have provided with a code query parameter.
GET /oauthcallback?code={authorizationCode}
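Your server handles that redirect by reading the code query parameter. A framework-agnostic sketch using Node's built-in WHATWG URL parser (the route wiring is left out, and extractAuthCode is our own helper name):

```javascript
// Extract the authorization code from the OAuth2 redirect URL.
// In a real server this runs inside your /oauthcallback handler.
function extractAuthCode(requestUrl, base = 'http://localhost') {
  const { searchParams } = new URL(requestUrl, base);
  const code = searchParams.get('code');
  if (!code) {
    throw new Error('No authorization code in callback URL');
  }
  return code;
}

const code = extractAuthCode('/oauthcallback?code=4/0AbCdEfG');
console.log(code);
```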
With the code returned, you can ask for an access token as shown below:
// This will provide an object with the access_token and refresh_token.
// Save these somewhere safe so they can be used at a later time.
const {tokens} = await oauth2Client.getToken(code);
oauth2Client.setCredentials(tokens);
With the credentials set on your OAuth2 client - you're ready to go!
Access tokens expire. This library will automatically use a refresh token to obtain a new access token if it is about to expire. An easy way to make sure you always store the most recent tokens is to use the tokens event:
oauth2Client.on('tokens', (tokens) => {
  if (tokens.refresh_token) {
    // store the refresh_token in my database!
    console.log(tokens.refresh_token);
  }
  console.log(tokens.access_token);
});
To set the refresh_token at a later time, you can use the setCredentials method:
oauth2Client.setCredentials({
  refresh_token: `STORED_REFRESH_TOKEN`
});
Once the client has a refresh token, access tokens will be acquired and refreshed automatically in the next call to the API.
You may need to send an API key with the request you are going to make. The following uses an API key to make a request to the Blogger API service to retrieve a blog's name, URL, and total number of posts:
const {google} = require('googleapis');
const blogger = google.blogger({
  version: 'v3',
  auth: 'YOUR_API_KEY' // specify your API key here
});

const params = {
  blogId: 3213900
};

async function main() {
  const res = await blogger.blogs.get(params);
  console.log(`${res.data.name} has ${res.data.posts.totalItems} posts! The blog url is ${res.data.url}`);
}

main().catch(console.error);
To learn more about API keys, please see the documentation.
Rather than manually creating an OAuth2 client, JWT client, or Compute client, the auth library can create the correct credential type for you, depending upon the environment your code is running under.
For example, a JWT auth client will be created when your code is running on your local developer machine, and a Compute client will be created when the same code is running on a configured instance of Google Compute Engine.
The code below shows how to retrieve a default credential type, depending upon the runtime environment. Call createScopedRequired to determine when you need to pass in the scopes manually, and when they have been set for you automatically based on the configured runtime environment.
const {google} = require('googleapis');
const compute = google.compute('v1');
async function main() {
  // This method looks for the GCLOUD_PROJECT and GOOGLE_APPLICATION_CREDENTIALS
  // environment variables.
  const auth = new google.auth.GoogleAuth({
    // Scopes can be specified either as an array or as a single, space-delimited string.
    scopes: ['https://www.googleapis.com/auth/compute']
  });
  const authClient = await auth.getClient();

  // obtain the current project Id
  const project = await auth.getProjectId();

  // Fetch the list of GCE zones within a project.
  const res = await compute.zones.list({ project, auth: authClient });
  console.log(res.data);
}
main().catch(console.error);
You can set the auth as a global or service-level option so you don't need to specify it on every request. For example, you can set auth as a global option:
const {google} = require('googleapis');
const oauth2Client = new google.auth.OAuth2(
  YOUR_CLIENT_ID,
  YOUR_CLIENT_SECRET,
  YOUR_REDIRECT_URL
);

// set auth as a global default
google.options({
  auth: oauth2Client
});
Instead of setting the option globally, you can also set the authentication client at the service-level:
const {google} = require('googleapis');
const oauth2Client = new google.auth.OAuth2(
  YOUR_CLIENT_ID,
  YOUR_CLIENT_SECRET,
  YOUR_REDIRECT_URL
);

const drive = google.drive({
  version: 'v2',
  auth: oauth2Client
});
See the Options section for more information.
The body of the request is specified in the requestBody parameter object as a JavaScript object with key/value pairs. For example, this sample creates a watcher that posts notifications to a Google Cloud Pub/Sub topic when emails are sent to a Gmail account:
const res = await gmail.users.watch({
  userId: 'me',
  requestBody: {
    // Replace with `projects/${PROJECT_ID}/topics/${TOPIC_NAME}`
    topicName: `projects/el-gato/topics/gmail`
  }
});
console.log(res.data);
This client supports multipart media uploads. The resource parameters are specified in the requestBody parameter object, and the media itself is specified in the media.body parameter, with its mime-type specified in media.mimeType.
This example uploads a plain text file to Google Drive with the title "Test" and contents "Hello World".
const drive = google.drive({
  version: 'v3',
  auth: oauth2Client
});

const res = await drive.files.create({
  requestBody: {
    name: 'Test',
    mimeType: 'text/plain'
  },
  media: {
    mimeType: 'text/plain',
    body: 'Hello World'
  }
});
You can also upload media by specifying media.body as a Readable stream. This allows you to upload very large files that cannot fit into memory.
const fs = require('fs');
const drive = google.drive({
  version: 'v3',
  auth: oauth2Client
});

async function main() {
  const res = await drive.files.create({
    requestBody: {
      name: 'testimage.png',
      mimeType: 'image/png'
    },
    media: {
      mimeType: 'image/png',
      body: fs.createReadStream('awesome.png')
    }
  });
  console.log(res.data);
}

main().catch(console.error);
For more examples of creation and modification requests with media attachments, take a look at the samples/drive/upload.js sample.
For more fine-tuned control over how your API calls are made, we provide you with the ability to specify additional options that can be applied directly to the 'gaxios' object used in this library to make network calls to the API.
You may specify additional options either on the global google object or on a per-service-client basis. The options you specify are attached to the gaxios object, so whatever gaxios supports, this library supports. You may also specify global or per-service request parameters that will be attached to all API calls you make.
A full list of supported options can be found in the gaxios documentation.
You can choose default options that will be sent with each request. These options will be used for every service instantiated by the google client. In this example, the timeout property of GaxiosOptions will be set for every request:
const {google} = require('googleapis');
google.options({
  // All requests made with this object will use these settings unless overridden.
  timeout: 1000,
  auth: auth
});
You can also modify the parameters sent with each request:
const {google} = require('googleapis');
google.options({
  // All requests from all services will contain the above query parameter
  // unless overridden either in a service client or in individual API calls.
  params: {
    quotaUser: 'user123@example.com'
  }
});
You can also specify options when creating a service client.
const blogger = google.blogger({
  version: 'v3',
  // All requests made with this object will use the specified auth.
  auth: 'API KEY'
});
By doing this, every API call made with this service client will use 'API KEY' to authenticate.
Note: Created clients are immutable so you must create a new one if you want to specify different options.
Similar to the examples above, you can also modify the parameters used for every call of a given service:
const blogger = google.blogger({
  version: 'v3',
  // All requests made with this service client will contain the
  // blogId query parameter unless overridden in individual API calls.
  params: {
    blogId: 3213900
  }
});
// Calls with this drive client will NOT contain the blogId query parameter.
const drive = google.drive('v3');
...
You can specify an auth object to be used per request. Each request also inherits the options specified at the service level and the global level.
For example:
const {google} = require('googleapis');
const bigquery = google.bigquery('v2');
async function main() {
  // This method looks for the GCLOUD_PROJECT and GOOGLE_APPLICATION_CREDENTIALS
  // environment variables.
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform']
  });
  const authClient = await auth.getClient();
  const projectId = await auth.getProjectId();
  const request = {
    projectId,
    datasetId: '<YOUR_DATASET_ID>',
    // This is a "request-level" option
    auth: authClient
  };
  const res = await bigquery.datasets.delete(request);
  console.log(res.data);
}
main().catch(console.error);
You can also override gaxios options per request, such as url, method, and responseType.
For example:
const res = await drive.files.export({
  fileId: 'asxKJod9s79', // A Google Doc
  mimeType: 'application/pdf'
}, {
  // Make sure we get the binary data
  responseType: 'stream'
});
You can use the following environment variables to proxy HTTP and HTTPS requests:
HTTP_PROXY / http_proxy
HTTPS_PROXY / https_proxy
When HTTP_PROXY / http_proxy are set, they will be used to proxy non-SSL requests that do not have an explicit proxy configuration option present. Similarly, HTTPS_PROXY / https_proxy will be respected for SSL requests that do not have an explicit proxy configuration option. It is valid to define a proxy in one of the environment variables, but then override it for a specific request, using the proxy configuration option.
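For example, in a shell (the proxy host and port are placeholders):

```shell
# Route plain-HTTP and HTTPS traffic through a proxy for this shell session.
export HTTP_PROXY="http://proxy.example.com:3128"
export http_proxy="$HTTP_PROXY"
export HTTPS_PROXY="http://proxy.example.com:3128"
export https_proxy="$HTTPS_PROXY"
```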
You can programmatically obtain the list of supported APIs, and all available versions:
const {google} = require('googleapis');
const apis = google.getSupportedAPIs();
This will return an object with the API names as property names, and arrays of version strings as the values.
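For example, you could print each API alongside its versions like this (the object below is hand-written in that documented shape, standing in for the live getSupportedAPIs() result):

```javascript
// Shape matches what getSupportedAPIs() returns:
// API names as keys, arrays of version strings as values.
const apis = {
  blogger: ['v3'],
  drive: ['v2', 'v3'],
  youtube: ['v3']
};

// One "name: versions" line per API.
const lines = Object.entries(apis).map(
  ([name, versions]) => `${name}: ${versions.join(', ')}`
);
console.log(lines.join('\n'));
```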
This library is written in TypeScript, and provides types out of the box. All classes and interfaces generated for each API are exported under the ${apiName}_${version} namespace. For example, the Drive v3 API types are all available from the drive_v3 namespace:
import { drive_v3 } from 'googleapis';
You can find a detailed list of breaking changes and new features in our Release Notes. If you've used this library before 25.x, see our Release Notes to learn about migrating your code from 24.x.x to 25.x.x. It's pretty easy :)
This library is licensed under Apache 2.0. Full license text is available in LICENSE.
We love contributions! Before submitting a Pull Request, it's always good to start with a new issue first. To learn more, see CONTRIBUTING.
FAQs
Google APIs Client Library for Node.js
The npm package googleapis receives a total of 1,783,499 weekly downloads. As such, googleapis popularity was classified as popular.
We found that googleapis demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 1 open source maintainer collaborating on the project.